-
We introduce a novel method for extracting a fragmentation model directly from experimental data without requiring an explicit parametric form, called Histories and Observables for Monte-Carlo Event Reweighting (HOMER), consisting of three steps: the training of a classifier between simulation and data, the inference of single fragmentation weights, and the calculation of the weight for the full hadronization chain. We illustrate the use of HOMER on a simplified hadronization problem, a $q\bar{q}$ string fragmenting into pions, and extract a modified Lund string fragmentation function $f(z)$. We then demonstrate the use of HOMER on three types of experimental data: (i) binned distributions of high-level observables, (ii) unbinned event-by-event distributions of these observables, and (iii) full particle cloud information. After demonstrating that $f(z)$ can be extracted from data (the inverse of hadronization), we also show that, at least in this limited setup, the fidelity of the extracted $f(z)$ suffers only limited loss when moving from (i) to (ii) to (iii). Public code is available at https://gitlab.com/uchep/mlhad.
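A minimal sketch of the three-step logic, using scikit-learn and invented beta-distributed toy data in place of the actual HOMER implementation (which lives in the repository linked above); all names and distributions here are illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy stand-ins for per-emission features, e.g. the lightcone momentum
# fraction z of each string break (both distributions are invented).
sim = rng.beta(2.0, 3.0, size=(10000, 1))   # baseline simulation f(z)
data = rng.beta(2.5, 3.0, size=(10000, 1))  # "experimental" f(z) to recover

# Step 1: train a classifier to separate simulation from data.
X = np.vstack([sim, data])
y = np.concatenate([np.zeros(len(sim)), np.ones(len(data))])
clf = LogisticRegression().fit(X, y)

# Step 2: turn the classifier score s into a single-emission weight
# via the likelihood-ratio trick, w(z) = s(z) / (1 - s(z)).
def emission_weights(z):
    s = clf.predict_proba(np.asarray(z).reshape(-1, 1))[:, 1]
    return s / (1.0 - s)

# Step 3: the weight of a full hadronization chain is the product of
# the weights of its individual emissions.
def chain_weight(emissions):
    return float(np.prod(emission_weights(emissions)))

print(chain_weight([0.3, 0.45, 0.6]))
```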
-
We introduce a model of hadronization based on invertible neural networks that faithfully reproduces a simplified version of the Lund string model for meson hadronization. Additionally, we introduce a new training method for normalizing flows, termed MAGIC, that improves the agreement between simulated and experimental distributions of high-level (macroscopic) observables by adjusting single-emission (microscopic) dynamics. Our results constitute an important step toward realizing a machine-learning-based model of hadronization that utilizes experimental data during training. Finally, we demonstrate how a Bayesian extension to this normalizing-flow architecture can be used to provide analysis of statistical and modeling uncertainties on the generated observable distributions.
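MAGIC itself trains a normalizing flow with gradient-based methods; the toy below, with an invented microscopic emission model and observable, only schematizes the core idea of adjusting single-emission dynamics until a macroscopic observable distribution matches a target:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented microscopic model: each emission draws z ~ Beta(a, 3); the
# macroscopic, event-level observable is a crude multiplicity proxy.
def observable(a, n_events=5000):
    z = rng.beta(a, 3.0, size=(n_events, 20))
    return (z > 0.5).sum(axis=1)  # per-event count of "hard" emissions

target = observable(2.5)  # stand-in for an experimental distribution

# Squared distance between simulated and target observable histograms.
def loss(a):
    bins = np.arange(22)
    p, _ = np.histogram(observable(a), bins=bins, density=True)
    q, _ = np.histogram(target, bins=bins, density=True)
    return float(np.sum((p - q) ** 2))

# Coarse parameter scan standing in for gradient-based flow training.
grid = np.linspace(1.5, 3.5, 41)
best = grid[int(np.argmin([loss(a) for a in grid]))]
print(f"recovered microscopic parameter: {best:.2f} (target was 2.5)")
```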
-
Measurements of quarkonia isolation in jets at the Large Hadron Collider (LHC) have been shown to disagree with fixed-order non-relativistic quantum chromodynamics (NRQCD) calculations, even at higher orders. Calculations using the fragmenting jet function formalism are able to better describe data but cannot provide full event-level predictions. In this work we provide an alternative model via NRQCD production of quarkonia in a timelike parton shower. We include this model in the Pythia 8 event generator and validate our parton-shower implementation against analytic forms of the relevant fragmentation functions. Finally, we make inclusive predictions of quarkonia production for the decay of the Standard Model Higgs boson.
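For orientation, standard hard-process NRQCD onium production can be enabled in Pythia 8 as sketched below (using the Python bindings that ship with Pythia 8.3, if built); the shower-based production introduced in this work has its own settings, for which the paper and the online manual should be consulted:

```python
import pythia8  # Python bindings distributed with Pythia 8.3

pythia = pythia8.Pythia()
pythia.readString("Beams:eCM = 13000.")        # LHC-like pp collisions
pythia.readString("Charmonium:all = on")       # NRQCD onium hard processes
pythia.readString("PhaseSpace:pTHatMin = 10.")
pythia.init()

n_jpsi = 0
for _ in range(100):
    if not pythia.next():
        continue
    for i in range(pythia.event.size()):
        if pythia.event[i].id() == 443:        # J/psi in the event record
            n_jpsi += 1
print("J/psi candidates in 100 events:", n_jpsi)
```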
-
This work reports on a method for uncertainty estimation in simulated collider-event predictions. The method is based on a Monte Carlo veto algorithm, and extends previous work on uncertainty estimates in parton showers by including uncertainty estimates for the Lund string-fragmentation model. This method is advantageous from the perspective of simulation costs: a single ensemble of generated events can be reinterpreted as though it were obtained using a different set of input parameters, where each event is now accompanied by a corresponding weight. This allows for a robust exploration of the uncertainties arising from the choice of input model parameters, without the need to rerun full simulation pipelines for each input parameter choice. Such explorations are important when determining the sensitivities of precision physics measurements. Accompanying code is available at https://gitlab.com/uchep/mlhad-weights-validation.
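The essence of such reweighting is that each trial of the underlying accept-reject step picks up a calculable factor when the input parameters change: f'(x)/f(x) for an accepted trial and (g(x) - f'(x))/(g(x) - f(x)) for a vetoed one, where g is the overestimate. A toy sketch with invented densities (this is not the accompanying validation code):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy densities on [0, 1]: f(x; a) = (a + 1) x^a, bounded by the constant
# overestimate G used by the accept-reject step.
def f(x, a):
    return (a + 1.0) * x**a

G = 4.0            # must satisfy G >= f(x, a) for both parameter choices
a0, a1 = 1.0, 2.0  # baseline and target parameters

def sample_with_weight():
    """Sample x from f(.; a0); return x and its reweighting factor to a1."""
    w = 1.0
    while True:
        x = rng.uniform()
        if rng.uniform() < f(x, a0) / G:          # accepted trial
            return x, w * f(x, a1) / f(x, a0)
        w *= (G - f(x, a1)) / (G - f(x, a0))      # vetoed trial

pairs = [sample_with_weight() for _ in range(50000)]
xs, ws = map(np.array, zip(*pairs))
# The weighted mean approaches the mean of f(.; a1): (a1+1)/(a1+2) = 0.75.
print(np.average(xs, weights=ws))
```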
-
This manual describes the Pythia event generator, the most recent version of an evolving physics tool used to answer fundamental questions in particle physics. The program is most often used to generate high-energy-physics collision "events", i.e. sets of particles produced in association with the collision of two incoming high-energy particles, but has several uses beyond that. The guiding philosophy is to produce and reproduce properties of experimentally obtained collisions as accurately as possible. The program includes a wide range of reactions within and beyond the Standard Model, extending also to heavy-ion physics. Emphasis is placed on phenomena where strong interactions play a major role. The manual contains both pedagogical and practical components. All included physics models are described in enough detail to allow the user to obtain a cursory overview of the assumptions and approximations used, enabling an informed evaluation of the program output. A number of the most central algorithms are described in enough detail that the main results of the program can be reproduced independently, allowing further development of existing models or the addition of new ones. Finally, a chapter dedicated fully to the user is included towards the end, providing pedagogical examples of standard use cases and a detailed description of a number of external interfaces. The program code, the online manual, and the latest version of this print manual can be found on the Pythia web page: https://www.pythia.org/.
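In the spirit of the manual's pedagogical examples, a minimal standard use case might look as follows, here via the Python interface distributed with Pythia 8.3 (the manual's own worked examples are written in C++):

```python
import pythia8  # Python bindings distributed with Pythia 8.3

pythia = pythia8.Pythia()
pythia.readString("Beams:eCM = 13000.")     # 13 TeV pp collisions
pythia.readString("HardQCD:all = on")       # QCD 2 -> 2 hard processes
pythia.readString("PhaseSpace:pTHatMin = 20.")
pythia.init()

n_events, n_charged = 100, 0
for _ in range(n_events):
    if not pythia.next():
        continue
    for i in range(pythia.event.size()):
        p = pythia.event[i]
        if p.isFinal() and p.isCharged():
            n_charged += 1
pythia.stat()  # print cross-section and error statistics
print("mean charged multiplicity:", n_charged / n_events)
```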
-
In this work, we consider the case of a strongly coupled dark/hidden sector, which extends the Standard Model (SM) by adding an additional non-Abelian gauge group. These extensions generally contain matter fields, much like the SM quarks, and gauge fields similar to the SM gluons. We focus on the exploration of such sectors where the dark particles are produced at the LHC through a portal and undergo rapid hadronization within the dark sector before decaying back, at least in part and potentially with sizeable lifetimes, to SM particles, giving a range of possibly spectacular signatures such as emerging or semi-visible jets. Other, non-QCD-like scenarios leading to soft unclustered energy patterns or glueballs are also discussed. After a review of the theory, existing benchmarks, and constraints, this work addresses how to build consistent benchmarks from the underlying physical parameters and presents new developments for the Pythia Hidden Valley module, along with jet substructure studies. Finally, a series of improved search strategies is presented in order to pave the way for a better exploration of the dark showers at the LHC.
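As a rough illustration, a QCD-like dark shower produced through a Z' portal can be configured with the Hidden Valley module along the following lines; the process flags are standard module settings, but the masses and values shown are placeholders, and the current online manual should be consulted for the full set of parameters:

```python
import pythia8  # Python bindings distributed with Pythia 8.3

pythia = pythia8.Pythia()
pythia.readString("Beams:eCM = 13000.")
pythia.readString("HiddenValley:ffbar2Zv = on")  # production via a Z' portal
pythia.readString("HiddenValley:Ngauge = 3")     # SU(3) dark gauge group
pythia.readString("HiddenValley:FSR = on")       # dark-sector shower
pythia.readString("HiddenValley:fragment = on")  # dark-sector hadronization
pythia.readString("4900023:m0 = 1000.")          # Z' mass [GeV], placeholder
pythia.readString("4900101:m0 = 10.")            # dark-quark mass [GeV], placeholder
pythia.init()
pythia.next()
```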
-
Particles beyond the Standard Model (SM) can generically have lifetimes that are long compared to SM particles at the weak scale. When produced at experiments such as the Large Hadron Collider (LHC) at CERN, these long-lived particles (LLPs) can decay far from the interaction vertex of the primary proton–proton collision. Such LLP signatures are distinct from those of promptly decaying particles that are targeted by the majority of searches for new physics at the LHC, often requiring customized techniques to identify, for example, significantly displaced decay vertices, tracks with atypical properties, and short track segments. Given their non-standard nature, a comprehensive overview of LLP signatures at the LHC is beneficial to ensure that possible avenues of the discovery of new physics are not overlooked. Here we report on the joint work of a community of theorists and experimentalists with the ATLAS, CMS, and LHCb experiments, as well as those working on dedicated experiments such as MoEDAL, milliQan, MATHUSLA, CODEX-b, and FASER, to survey the current state of LLP searches at the LHC, and to chart a path for the development of LLP searches into the future, both in the upcoming Run 3 and at the high-luminosity LHC. The work is organized around the current and future potential capabilities of LHC experiments to generally discover new LLPs, and takes a signature-based approach to surveying classes of models that give rise to LLPs rather than emphasizing any particular theory motivation. We develop a set of simplified models; assess the coverage of current searches; document known, often unexpected backgrounds; explore the capabilities of proposed detector upgrades; provide recommendations for the presentation of search results; and look towards the newest frontiers, namely high-multiplicity 'dark showers', highlighting opportunities for expanding the LHC reach for these signals.
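A back-of-the-envelope sketch of why lifetime and boost control where an LLP decays, and hence its signature: sampling exponential proper decay times and boosting to the lab frame gives the fraction of decays inside a tracker-sized volume (all numbers invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy picture of where an LLP decays: exponential proper decay times,
# boosted to the lab frame. All numbers are invented for illustration.
c_tau = 0.1        # proper decay length c*tau [m]
beta_gamma = 3.0   # typical boost p/m of the produced LLP
n = 100000

# Lab-frame decay length along the flight direction.
L = beta_gamma * rng.exponential(c_tau, size=n)

tracker_radius = 1.1  # [m], a rough inner-tracker scale
frac_inside = np.mean(L < tracker_radius)
print(f"fraction decaying within {tracker_radius} m: {frac_inside:.2f}")
```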
-
Many measurements at the LHC require efficient identification of heavy-flavour jets, i.e. jets originating from bottom (b) or charm (c) quarks. An overview of the algorithms used to identify c jets is given, and a novel method to calibrate them is presented. This new method adjusts the entire distributions of the outputs obtained when the algorithms are applied to jets of different flavours. It is based on an iterative approach exploiting three distinct control regions that are enriched with either b jets, c jets, or light-flavour and gluon jets. Results are presented in the form of correction factors evaluated using proton-proton collision data with an integrated luminosity of 41.5 fb⁻¹ at √s = 13 TeV, collected by the CMS experiment in 2017. The closure of the method is tested by applying the measured correction factors to simulated data sets and checking the agreement between the adjusted simulation and collision data. Furthermore, a validation is performed by testing the method on pseudodata, which emulate various mismodelling conditions. The calibrated results enable the use of the full distributions of heavy-flavour identification algorithm outputs, e.g. as inputs to machine-learning models. Thus, they are expected to increase the sensitivity of future physics analyses.
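The core of the method can be sketched as follows: in each bin of the tagger output, the three flavour-enriched control regions give three equations for the three per-flavour correction factors. The toy below solves this per-bin system directly with invented shapes and purities; the actual CMS procedure is iterative and uses data-driven flavour fractions with systematic uncertainties:

```python
import numpy as np

n_bins = 10
x = np.linspace(0.05, 0.95, n_bins)  # tagger-output bin centres

# Invented normalized discriminator shapes per flavour (b, c, light).
def shape(mu):
    h = np.exp(-0.5 * ((x - mu) / 0.25) ** 2)
    return h / h.sum()
mc = np.stack([shape(0.7), shape(0.5), shape(0.3)])

# Invented flavour composition of the three enriched control regions.
frac = np.array([[0.80, 0.10, 0.10],   # b-enriched region
                 [0.20, 0.60, 0.20],   # c-enriched region
                 [0.05, 0.10, 0.85]])  # light-enriched region

# Pseudodata with known per-bin mismodelling, to be recovered below.
true_sf = 1.0 + 0.2 * np.sin(np.outer(np.arange(3) + 1.0, x))
data = frac @ (true_sf * mc)

# Per bin, the three regions give a 3x3 linear system for the three
# flavour correction factors; solve it bin by bin.
sf = np.empty_like(mc)
for b in range(n_bins):
    A = frac * mc[:, b]  # A[r, f] = frac[r, f] * mc[f, b]
    sf[:, b] = np.linalg.solve(A, data[:, b])

print(np.allclose(sf, true_sf))  # True: the injected mismodelling is recovered
```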